Image Decomposition Model OSV with L0 Sparse Regularization

Authors

  • Guodong Wang
  • Jie Xu
  • Zhenkuan Pan
  • Weizhong Zhang
  • Zhaojing Diao
Abstract

Image decomposition has attracted much attention in recent years, and many variational methods have been proposed. The best-known variational decomposition model is the OSV model, owing to its concise formulation. However, the OSV model suffers from several drawbacks. One is that discretizing its Euler-Lagrange equation is difficult because a fourth-order term is introduced; another is that the decomposition effect is weak. In this paper, we propose a new decomposition model with L0 sparse regularization. The L0 term enforces stronger sparsity than the traditional L1 norm and is useful in many image processing applications. Combining the OSV model with L0 regularization improves on the traditional decomposition. To make the minimization efficient, a Split Bregman method for the improved OSV model is also designed. Experimental results validate the proposed model.
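For orientation, the classical OSV energy and one plausible form of the L0-regularized variant described above can be written as follows. The exact functional used in the paper is not reproduced here, so the L0 term below (counting pixels with nonzero gradient, as in L0 gradient minimization) is an assumption:

    OSV:         min_u ∫ |∇u| dx + λ ‖f − u‖²_{H−1},   with ‖v‖²_{H−1} = ∫ |∇(Δ^{−1} v)|² dx
    L0 variant:  min_u ‖∇u‖_0 + λ ‖f − u‖²_{H−1},      with ‖∇u‖_0 = #{x : ∇u(x) ≠ 0}

The fourth-order term mentioned in the abstract arises because the Euler-Lagrange equation of the OSV model contains Δ[div(∇u/|∇u|)]. In a Split Bregman scheme the L0 term is handled through an auxiliary variable d ≈ ∇u, whose subproblem decouples pixelwise and is solved by hard thresholding rather than the soft shrinkage used for L1. A minimal NumPy sketch of that subproblem, with an illustrative penalty parameter theta that is not taken from the paper:

import numpy as np

def hard_threshold(v, theta):
    # Pixelwise solution of  min_d ||d||_0 + (theta/2) * ||d - v||^2 :
    # keep the gradient vector v only if (theta/2)*|v|^2 > 1, i.e. |v|^2 > 2/theta.
    # theta is an illustrative splitting parameter (assumption, not from the paper).
    mag2 = np.sum(v ** 2, axis=-1, keepdims=True)  # squared gradient magnitude per pixel
    return np.where(mag2 > 2.0 / theta, v, 0.0)

# Example: v stores (u_x, u_y) per pixel with shape (H, W, 2).
v = np.random.randn(64, 64, 2)
d = hard_threshold(v, theta=5.0)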


Related articles

Variational Image Decomposition Model OSV With General Diffusion Regularization

Image decomposition is a very useful tool for image analysis. Images contain a structural component and a textural component, which can be separated by variational methods such as the VO (Vese-Osher) and OSV (Osher-Solé-Vese) models. The OSV model is a powerful tool for image decomposition, but its minimization is a hard problem because it requires solving a fourth-order partial differential equation with co...


An efficient algorithm for adaptive total variation based image decomposition and restoration

With the aim of better preserving sharp edges and important structural features in the recovered image, this article studies an improved adaptive total variation regularization and H−1 norm fidelity based strategy for image decomposition and restoration. Computationally, for minimizing the proposed energy functional, we investigate an efficient numerical algorithm, the split Bregman method, and ...


Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...


Introducing oriented Laplacian diffusion into a variational decomposition model

The decomposition model proposed by Osher, Solé and Vese in 2003 (the OSV model) is known for its good denoising performance. This performance has been found to be due to its higher weighting of lower image frequencies in the H−1 norm modeling the noise component in the model. However, the OSV model also tends to move high-frequency texture into this noise component. Diffusion with an oriented L...


Norm Regularization Algorithm for Image Deconvolution

Up to now, the non-convex l_p (0 < p < 1) norm regularization function has shown good performance for sparse signal processing. Indeed, it benefits from a significantly heavier-tailed hyper-Laplacian model, which is desirable in the context of image gradient distributions. Both l_1/2 and l_2/3 regularization methods have been given analytic solutions and fast closed-form thresholding formulae i...




Publication year: 2015